West Germany - Dictionary definition and meaning
West Germany
Definition (noun) a republic in north central Europe on the North Sea; established in 1949 from the zones of Germany occupied by the British, French, and Americans after the German defeat; reunified with East Germany in 1990